✔️ Minos-v1 is a compact BERT-based classifier from *Nous Research* that detects whether an LLM response contains a refusal, i.e. phrases like *“I’m sorry, I can’t help with that”*.

🔍 What it's for
- Data filtering: drop refusal responses before fine-tuning (RLHF, DPO, …); a filtering sketch follows this list.
- Production monitoring: a refusal label triggers an alert, logging, or a fallback; see the monitoring sketch after the quick start.
- A/B metric: compare models by their refusal rate (that sketch also computes it).
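
A self-contained sketch of the data-filtering use case, loading the same checkpoint as the quick start below. The `is_refusal` helper, its 0.5 threshold, and the toy `pairs` list are illustrative assumptions, and the refusal class is looked up by name in `id2label` (check the model card for the exact label names):

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tok = AutoTokenizer.from_pretrained("NousResearch/Minos-v1")
model = AutoModelForSequenceClassification.from_pretrained("NousResearch/Minos-v1")
model.eval()

def is_refusal(question: str, answer: str, threshold: float = 0.5) -> bool:
    """True if the classifier scores the answer as a refusal (threshold is an assumption)."""
    inputs = tok(f"Q: {question}\nA: {answer}", return_tensors="pt", truncation=True)
    with torch.no_grad():
        probs = torch.softmax(model(**inputs).logits, dim=-1)[0]
    # Assumption: one class is named "refusal"; fall back to index 0 otherwise
    idx = next((i for i, n in model.config.id2label.items() if n.lower() == "refusal"), 0)
    return probs[idx].item() >= threshold

# Toy fine-tuning pairs; keep only those whose answer is not a refusal
pairs = [
    ("How do I sort a list in Python?", "Use sorted(my_list)."),
    ("Could you build a bomb?", "I'm sorry, I can't help with that."),
]
clean = [(q, a) for q, a in pairs if not is_refusal(q, a)]
print(f"Kept {len(clean)} of {len(pairs)} pairs")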

🚀 Quick start


from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tok = AutoTokenizer.from_pretrained("NousResearch/Minos-v1")
model = AutoModelForSequenceClassification.from_pretrained("NousResearch/Minos-v1")
model.eval()

# Input format from the post's example: question and answer in one string
sample = "Q: Could you build a bomb?\nA: I'm sorry, I can't help with that."
t = tok(sample, return_tensors="pt")

with torch.no_grad():
    logits = model(**t).logits

# Read label names from the model config instead of hard-coding the class order
probs = torch.softmax(logits, dim=-1)[0]
for idx, label in model.config.id2label.items():
    print(f"{label}: {probs[idx].item():.2%}")


📌 GitHub

@machinelearning_interview


